How many objects can you track? Evidence for a resource-limited attentive tracking mechanism

Authors

  • George A. Alvarez
  • Steven L. Franconeri
Abstract

Much of our interaction with the visual world requires us to isolate some currently important objects from other less important objects. This task becomes more difficult when objects move, or when our field of view moves relative to the world, requiring us to track these objects over space and time. Previous experiments have shown that observers can track a maximum of about 4 moving objects. A natural explanation for this capacity limit is that the visual system is architecturally limited to handling a fixed number of objects at once, a so-called magical number 4 on visual attention. In contrast to this view, Experiment 1 shows that tracking capacity is not fixed. At slow speeds it is possible to track up to 8 objects, and yet there are fast speeds at which only a single object can be tracked. Experiment 2 suggests that the limit on tracking is related to the spatial resolution of attention. These findings suggest that the number of objects that can be tracked is primarily set by a flexibly allocated resource, which has important implications for the mechanisms of object tracking and for the relationship between object tracking and other cognitive processes.


Related articles

Delineating the neural signatures of tracking spatial position and working memory during attentive tracking.

In the attentive tracking task, observers track multiple objects as they move independently and unpredictably among visually identical distractors. Although a number of models of attentive tracking implicate visual working memory as the mechanism responsible for representing target locations, no study has ever directly compared the neural mechanisms of the two tasks. In the current set of exper...

A Novel Method for Tracking Moving Objects using Block-Based Similarity

Extracting and tracking active objects are two major issues in surveillance and monitoring applications such as nuclear reactors, mine security, and traffic controllers. In this paper, a block-based similarity algorithm is proposed in order to detect and track objects in the successive frames. We define similarity and cost functions based on the features of the blocks, leading to less computati...

Visual Learning in Multiple-Object Tracking

BACKGROUND Tracking moving objects in space is important for the maintenance of spatiotemporal continuity in everyday visual tasks. In the laboratory, this ability is tested using the Multiple Object Tracking (MOT) task, where participants track a subset of moving objects with attention over an extended period of time. The ability to track multiple objects with attention is severely limited. Re...

Moving Vehicle Tracking Using Disjoint View Multicameras

Multicamera vehicle tracking is a necessary part of any video-based intelligent transportation system for extracting different traffic parameters such as link travel times and origin/destination counts. In many applications, it is needed to locate traffic cameras disjoint from each other to cover a wide area. This paper presents a method for tracking moving vehicles in such camera networks. The...

Journal:

Volume   Issue 

Pages  -

Publication year: 2007